
    Interferometric analysis of an exploration seismic survey in northern Texas

    M.S., University of Oklahoma, 2015. Includes bibliographical references (leaves 67-72).

    The extraction of Earth responses from seismic data without an active source has received more attention in the past decade than ever before. This growth in popularity is primarily due to the increased availability of the computing capabilities required to process such data. Interferometry is the most common method of processing passive ambient data. This study compares different methods of interferometric computation and presents a workflow for the interferogram with the greatest clarity. Normalization methods include running absolute mean, sign bit, and an automatic gain control (AGC) based on the root-mean-square (RMS) average. Interval lengths from 1 minute to 120 minutes are compared, and the differences between cross-correlation and cross-coherence are examined. The final workflow uses running absolute mean normalization, cross-coherence, and a 30-minute interval length. Interferometry often deals with large amounts of data, greater than 17 terabytes in this case. Additionally, Central Processing Units (CPUs) and Graphics Processing Units (GPUs) are both used on each step of the workflow to find the most efficient hardware for each process. I analyzed the time cost associated with each step of interferometric computation and found that CPUs operate faster on complicated normalizations, while GPUs operate faster on simple normalizations and correlations. The workflow does not change based on these findings.
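    The abstract names two of the workflow's core operations: running-absolute-mean temporal normalization and frequency-domain cross-coherence versus cross-correlation. A minimal NumPy sketch of those operations follows; the function names, the window length parameter, and the small epsilon used to stabilize the spectral division are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def running_abs_mean_normalize(trace, win):
    """Temporal normalization: divide each sample by the running mean
    of absolute amplitude over a centered window of `win` samples
    (window length is an assumed parameter)."""
    kernel = np.ones(win) / win
    weights = np.convolve(np.abs(trace), kernel, mode="same")
    weights[weights == 0] = 1.0  # avoid division by zero in dead intervals
    return trace / weights

def cross_correlate(a, b):
    """Circular cross-correlation of two equal-length traces,
    computed in the frequency domain."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    return np.fft.irfft(A * np.conj(B), n=len(a))

def cross_coherence(a, b, eps=1e-8):
    """Cross-coherence: the same spectral cross-term, but divided by
    the product of the two amplitude spectra, which whitens both
    traces and keeps only phase information."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    C = A * np.conj(B) / (np.abs(A) * np.abs(B) + eps)
    return np.fft.irfft(C, n=len(a))
```

    The sketch shows the key difference the thesis examines: cross-correlation preserves the amplitude spectra of both traces, while cross-coherence normalizes them away, so a time shift between two traces appears as a sharp, delta-like peak in the coherence result.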